Optimality of 1-norm regularization among weighted 1-norms for sparse recovery: a case study on how to find optimal regularizations

Authors

  • Yann Traonmilin
  • Samuel Vaiter
Abstract

The 1-norm was proven to be a good convex regularizer for the recovery of sparse vectors from under-determined linear measurements. It has been shown that, with an appropriate measurement operator, a number of measurements of the order of the sparsity of the signal (up to log factors) is sufficient for stable and robust recovery. More recently, such recovery results have been extended to more general low-dimensional model sets and (convex) regularizers. These results lead to the following question: to recover a given low-dimensional model set from linear measurements, what is the “best” convex regularizer? To approach this problem, we propose a general framework to define several notions of “best regularizer” with respect to a low-dimensional model. We show, in the minimal case of sparse recovery in dimension 3, that the 1-norm is optimal for these notions. However, generalizing such results to the n-dimensional case seems out of reach. To tackle this problem, we propose looser notions of best regularizer and show that the 1-norm is optimal among weighted 1-norms for sparse recovery within this framework.
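For concreteness, the two families of recovery programs compared here can be written as follows (a schematic statement with generic symbols, not notation taken from the paper: A is the measurement matrix, y = Ax_0 the measurements, and w_i > 0 the weights):

\min_{x \in \mathbb{R}^n} \|x\|_1 \quad \text{subject to} \quad Ax = y \qquad \text{(1-norm, basis pursuit)}

\min_{x \in \mathbb{R}^n} \sum_{i=1}^n w_i |x_i| \quad \text{subject to} \quad Ax = y \qquad \text{(weighted 1-norm)}

The question studied is, roughly, whether any choice of weights w in the second program can outperform the uniform weights of the first for recovering sparse vectors.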


Similar resources

Truncated Singular Value Decomposition and Generalized Tikhonov Methods for Stabilizing the Downward Continuation Problem

The methods applied to the regularization of ill-posed problems can be classified as “direct” and “indirect” methods. Practice has shown that the effects of different regularization techniques on an ill-posed problem are not the same, and as such each ill-posed problem requires its own investigation to identify its most suitable regularization method. In the geoid computations witho...
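As a sketch of the two regularization families named in the title (illustrative notation only, assuming a generic discrete linear model y = Ax with singular value decomposition A = \sum_i \sigma_i u_i v_i^\top):

x_{\mathrm{TSVD}} = \sum_{i=1}^{k} \frac{u_i^\top y}{\sigma_i}\, v_i \qquad \text{(truncate the small singular values)}

x_{\mathrm{Tikhonov}} = \sum_{i} \frac{\sigma_i}{\sigma_i^2 + \lambda}\, (u_i^\top y)\, v_i \qquad \text{(damp the small singular values)}

Both filter the unstable directions associated with small \sigma_i; they differ in using a hard cut-off k versus a smooth damping parameter \lambda.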


Continuous-Domain Solutions of Linear Inverse Problems with Tikhonov vs. Generalized TV Regularization

We consider linear inverse problems that are formulated in the continuous domain. The object of recovery is a function that is assumed to minimize a convex objective functional. The solutions are constrained by imposing a continuous-domain regularization. We derive the parametric form of the solution (representer theorems) for Tikhonov (quadratic) and generalized total-variation (gTV) regulariza...
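Schematically, the two regularizations compared here take the following forms (illustrative notation only: H is the measurement operator, L a regularization operator, and \|\cdot\|_{\mathcal{M}} the total-variation norm of a measure):

\min_f \; \|y - Hf\|_2^2 + \lambda \|Lf\|_{L_2}^2 \qquad \text{(Tikhonov)}

\min_f \; \|y - Hf\|_2^2 + \lambda \|Lf\|_{\mathcal{M}} \qquad \text{(gTV)}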


Sparse Adaptive Iteratively-Weighted Thresholding Algorithm (SAITA) for Lp-Regularization Using the Multiple Sub-Dictionary Representation

L1/2 and L2/3 are two typical non-convex Lp (0 < p < 1) regularizations, which can be employed to obtain a sparser solution than L1 regularization. Recently, the multiple-state sparse transformation strategy has been developed to exploit the sparsity in L1 regularization for sparse signal recovery, which combines the iterative reweighted algorithms. To further exploit the sparse structu...
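The non-convex programs in question have the generic form (a sketch with assumed symbols, not the paper's notation):

\min_x \; \tfrac{1}{2}\|y - Ax\|_2^2 + \lambda \|x\|_p^p, \qquad \|x\|_p^p = \sum_i |x_i|^p, \quad p \in \{1/2,\, 2/3\}

For 0 < p < 1 the penalty is non-convex, which is why iteratively reweighted thresholding schemes of the kind named in the title are used to approach a solution.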


Inverse Problem Regularization with Weak Decomposable Priors. Part I: Recovery Guarantees

This first talk is dedicated to assessing the theoretical recovery performance of this class of regularizers. We consider regularizations with convex, positively 1-homogeneous functionals (in fact gauges) which obey a weak decomposability property. The weak decomposability will promote solutions of the inverse problem conforming to some notion of simplicity/low complexity by living on a low dimens...
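For reference, positive 1-homogeneity is the standard property J(\alpha x) = \alpha J(x) for all \alpha \ge 0, and the regularized inverse problem has the usual form (illustrative notation; the weak decomposability condition itself is not spelled out in this excerpt):

\min_x \; \tfrac{1}{2}\|y - Ax\|_2^2 + \lambda J(x)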


Unifying Framework for Fast Learning Rate of Non-Sparse Multiple Kernel Learning

In this paper, we give a new generalization error bound for Multiple Kernel Learning (MKL) for a general class of regularizations. Our main target in this paper is dense-type regularizations, including lp-MKL, which imposes lp-mixed-norm regularization instead of l1-mixed-norm regularization. According to recent numerical experiments, the sparse regularization does not necessarily show a good p...
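For orientation, the mixed-norm penalties contrasted here can be sketched as follows (assumed notation: a decision function f = \sum_{m=1}^M f_m, with f_m in the reproducing kernel Hilbert space \mathcal{H}_m of the m-th kernel):

\Omega_p(f) = \Big(\sum_{m=1}^M \|f_m\|_{\mathcal{H}_m}^p\Big)^{1/p}

With p = 1 this promotes sparse kernel combinations; p > 1 yields the dense (non-sparse) regularizations that are the paper's target.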




Publication date: 2018